A Linesearch Algorithm with Memory for Unconstrained Optimization

Authors

  • Nicholas I. M. Gould
  • Stefano Lucidi
  • Massimo Roma
  • Philippe L. Toint
Abstract

This paper considers algorithms for unconstrained nonlinear optimization where the model used by the algorithm to represent the objective function explicitly includes memory of past iterations. This is intended to make the algorithm less "myopic", in the sense that its behaviour is not completely dominated by the local nature of the objective function, but rather by a more global view. We present a non-monotone linesearch algorithm that has this feature and prove its global convergence.
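The nonmonotone acceptance idea can be illustrated with a minimal backtracking sketch in the spirit of Grippo, Lampariello and Lucidi, where the Armijo test is taken against the maximum of the last M objective values rather than the current one. This is an illustrative stand-in for the paper's memory-based model, not its exact rule; `M`, `c1` and `tau` are arbitrary illustrative parameters.

```python
import numpy as np

def nonmonotone_linesearch(f, x, g, d, f_hist, c1=1e-4, tau=0.5, max_iter=50):
    """Backtracking linesearch that accepts a step against the maximum of
    the recent objective values (the 'memory'), not just the current one."""
    f_ref = max(f_hist)          # reference value over the stored history
    slope = g @ d                # directional derivative; negative for descent d
    alpha = 1.0
    for _ in range(max_iter):
        if f(x + alpha * d) <= f_ref + c1 * alpha * slope:
            return alpha
        alpha *= tau             # backtrack
    return alpha

def minimize(f, grad, x0, M=5, tol=1e-8, max_iter=500):
    """Steepest descent driven by the nonmonotone linesearch above."""
    x = x0.copy()
    f_hist = [f(x)]
    for _ in range(max_iter):
        g = grad(x)
        if np.linalg.norm(g) < tol:
            break
        d = -g
        alpha = nonmonotone_linesearch(f, x, g, d, f_hist)
        x = x + alpha * d
        f_hist.append(f(x))
        f_hist = f_hist[-M:]     # keep only the last M values (the memory)
    return x
```

Because the reference value is a maximum over several past iterates, individual steps may increase the objective, which is what frees the iteration from purely local, monotone behaviour.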


Related articles

A limited memory adaptive trust-region approach for large-scale unconstrained optimization

This study concerns a trust-region-based method for solving unconstrained optimization problems. The approach takes advantage of the compact limited-memory BFGS updating formula together with an appropriate adaptive radius strategy. In our approach, the adaptive technique reduces the number of subproblems to be solved, while exploiting the structure of limited-memory quasi-Newton ...
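The limited-memory pairs behind such methods can be sketched with the classic L-BFGS two-loop recursion, which applies the inverse quasi-Newton approximation to a gradient. The cited work uses the related compact matrix representation inside a trust region, but the stored curvature pairs (s, y) are the same ingredient; this is a hedged sketch, not that paper's implementation.

```python
import numpy as np

def lbfgs_two_loop(g, s_list, y_list):
    """Apply the inverse L-BFGS Hessian approximation to g using the stored
    step/gradient-difference pairs (s_k, y_k) via the two-loop recursion."""
    q = g.copy()
    rhos = [1.0 / (y @ s) for s, y in zip(s_list, y_list)]
    alphas = []
    # First loop: newest pair to oldest
    for s, y, rho in reversed(list(zip(s_list, y_list, rhos))):
        a = rho * (s @ q)
        alphas.append(a)
        q -= a * y
    # Initial scaling H0 = gamma * I from the most recent pair
    if y_list:
        s, y = s_list[-1], y_list[-1]
        q *= (s @ y) / (y @ y)
    # Second loop: oldest pair to newest
    for (s, y, rho), a in zip(zip(s_list, y_list, rhos), reversed(alphas)):
        b = rho * (y @ q)
        q += (a - b) * s
    return q                     # approximately H^{-1} g
```

Only the last few pairs are kept in practice, so the cost per iteration stays linear in the problem dimension, which is what makes the approach viable for large-scale problems.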


Nonconvex optimization using negative curvature within a modified linesearch

This paper describes a new algorithm for the solution of nonconvex unconstrained optimization problems, with the property of converging to points satisfying second-order necessary optimality conditions. The algorithm is based on a procedure which, from two descent directions, a Newton-type direction and a direction of negative curvature, selects in each iteration the linesearch model best adapt...
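The two ingredients such methods select between can be sketched as follows: a Newton-type direction built from a positive-definite modification of the Hessian, and a negative-curvature direction taken from its most negative eigenvalue. This dense eigendecomposition sketch only illustrates the two directions; the paper's selection procedure and large-scale machinery are not reproduced here.

```python
import numpy as np

def descent_pair(g, H):
    """Return (newton_dir, neg_curv_dir) for gradient g and Hessian H.
    neg_curv_dir is None when H is positive semidefinite."""
    w, V = np.linalg.eigh(H)                 # eigenvalues in ascending order
    # Newton-type direction from a positive-definite modification of H
    w_pos = np.maximum(np.abs(w), 1e-8)
    dN = -V @ ((V.T @ g) / w_pos)
    # Negative-curvature direction: eigenvector of the most negative eigenvalue
    dC = None
    if w[0] < 0:
        dC = V[:, 0].copy()
        if g @ dC > 0:                       # orient so it is non-ascent
            dC = -dC
    return dN, dC
```

Moving along the negative-curvature direction decreases the quadratic model even where the gradient gives no information, which is how such algorithms escape saddle points and reach second-order stationary points.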


A nonmonotone truncated Newton-Krylov method exploiting negative curvature directions, for large scale unconstrained optimization

We propose a new truncated Newton method for large scale unconstrained optimization, where a Conjugate Gradient (CG)-based technique is adopted to solve Newton’s equation. In the current iteration, the Krylov method computes a pair of search directions: the first approximates the Newton step of the quadratic convex model, while the second is a suitable negative curvature direction. A test based...
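The pairing of a CG-based Newton step with a curvature probe can be sketched with a truncated conjugate gradient loop on the Newton system that halts when it detects non-positive curvature and returns the offending Krylov direction. This is a simplified dense illustration of the general mechanism, not the specific test or preconditioning of the cited method.

```python
import numpy as np

def cg_with_negative_curvature(H, g, tol=1e-8, max_iter=100):
    """Truncated CG on H d = -g. Returns (step, neg_curv_dir), where
    neg_curv_dir is a direction p with p'Hp <= 0 if one is encountered."""
    d = np.zeros_like(g)
    r = -g.copy()                # residual of H d = -g at d = 0
    p = r.copy()
    for _ in range(max_iter):
        Hp = H @ p
        curv = p @ Hp
        if curv <= 0:
            return d, p          # non-positive curvature detected
        alpha = (r @ r) / curv
        d = d + alpha * p
        r_new = r - alpha * Hp
        if np.linalg.norm(r_new) < tol:
            return d, None       # Newton system solved to tolerance
        beta = (r_new @ r_new) / (r @ r)
        p = r_new + beta * p
        r = r_new
    return d, None
```

The partial step accumulated before the curvature test fires is still a useful approximate Newton direction, so nothing computed by the Krylov iteration is wasted.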


Exploiting Negative Curvature Directions in Linesearch Methods for Unconstrained Optimization

In this paper we consider the definition of new efficient linesearch algorithms for solving large-scale unconstrained optimization problems which exploit the local nonconvexity of the objective function. Existing algorithms of this class compute, at each iteration, two search directions: a Newton-type direction, which ensures global and fast convergence, and a negative curvature direction, which e...
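A classical way to combine the two directions, used by earlier algorithms of this class, is a curvilinear backtracking search along x(a) = x + a²·dN + a·dC. The sketch below uses a simple Armijo-style acceptance test; it is a hedged illustration of the curvilinear idea, not the acceptance rule proposed in this paper.

```python
import numpy as np

def curvilinear_search(f, x, dN, dC, g, c=1e-4, tau=0.5, max_iter=50):
    """Backtrack along the curvilinear path x + a^2*dN + a*dC, which blends
    the Newton-type direction dN with the negative-curvature direction dC."""
    f0 = f(x)
    slope = g @ dC               # dC is oriented as non-ascent, so slope <= 0
    alpha = 1.0
    for _ in range(max_iter):
        if f(x + alpha**2 * dN + alpha * dC) <= f0 + c * alpha * slope:
            return alpha
        alpha *= tau             # backtrack along the curve
    return alpha
```

For small steps the path is dominated by the curvature direction dC, while for larger accepted steps the Newton-type term a²·dN takes over, which is what preserves fast local convergence near a minimizer.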


A new hybrid conjugate gradient algorithm for unconstrained optimization

In this paper, a new hybrid conjugate gradient algorithm is proposed for solving unconstrained optimization problems. This new method can generate sufficient descent directions unrelated to any line search. Moreover, the global convergence of the proposed method is proved under the Wolfe line search. Numerical experiments are also presented to show the efficiency of the proposed algorithm, espe...
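Hybrid conjugate gradient parameters of this family typically clip or combine two classical formulas. The sketch below uses one common choice, beta = max(0, min(beta_HS, beta_DY)), built from the Hestenes-Stiefel and Dai-Yuan formulas; this is an illustrative example of the hybridization pattern, not necessarily the specific formula proposed in this paper.

```python
import numpy as np

def hybrid_beta(g_new, g_old, d_old, eps=1e-12):
    """One common hybrid CG parameter: max(0, min(beta_HS, beta_DY)),
    where y = g_new - g_old and d_old is the previous search direction."""
    y = g_new - g_old
    denom = d_old @ y
    if abs(denom) < eps:         # guard against a degenerate denominator
        return 0.0               # fall back to a steepest-descent restart
    beta_hs = (g_new @ y) / denom    # Hestenes-Stiefel
    beta_dy = (g_new @ g_new) / denom  # Dai-Yuan
    return max(0.0, min(beta_hs, beta_dy))

def next_direction(g_new, g_old, d_old):
    """New CG search direction d = -g + beta * d_old."""
    return -g_new + hybrid_beta(g_new, g_old, d_old) * d_old
```

Clipping beta at zero restarts the method with steepest descent when the conjugacy information becomes unreliable, which is one standard way such hybrids keep their descent and convergence guarantees.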




Publication date: 1998